Necessary and Sufficient Conditions on Sparsity Pattern Recovery

Authors

  • Alyson K. Fletcher
  • Sundeep Rangan
  • Vivek K. Goyal
Abstract

The problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements is of interest in many areas such as system identification, denoising, pattern recognition, and compressed sensing. This paper addresses the scaling of the number of measurements m, with signal dimension n and sparsity level k, for asymptotically-reliable detection. We show that a necessary condition for perfect recovery at any given SNR, for all algorithms regardless of complexity, is m = Ω(k log(n − k)) measurements. Conversely, this scaling of Ω(k log(n − k)) measurements is sufficient for a remarkably simple "maximum correlation" estimator. Hence this scaling is optimal and does not require more sophisticated techniques such as lasso or matching pursuit. The constants for both the necessary and sufficient conditions are precisely defined in terms of the minimum-to-average ratio of the nonzero components and the SNR. The necessary condition improves upon previous results for maximum likelihood estimation. For lasso, it also provides a necessary condition at any SNR, and for low SNR it improves upon previous work. The sufficient condition provides the first asymptotically-reliable detection guarantee at finite SNR.

Index Terms: compressed sensing, convex optimization, lasso, maximum likelihood estimation, random matrices, random projections, regression, sparse approximation, sparsity, subset selection
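
As a rough illustration of the abstract's "maximum correlation" estimator, the Python sketch below generates a k-sparse signal, takes m random noisy measurements, and picks the k indices whose measurement columns correlate most strongly with the observation y. The dimensions, SNR definition, and noise scaling are illustrative assumptions of this sketch, not values or conventions taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    n, k, m = 256, 8, 400          # signal dimension, sparsity, number of measurements (illustrative)
    snr = 10.0                     # assumed per-measurement SNR

    # k-sparse signal with +/-1 nonzeros on a random support
    support = np.sort(rng.choice(n, size=k, replace=False))
    x = np.zeros(n)
    x[support] = rng.choice([-1.0, 1.0], size=k)

    # random Gaussian measurement matrix and noise scaled to the target SNR
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    signal = A @ x
    noise_std = np.sqrt(np.mean(signal**2) / snr)
    y = signal + noise_std * rng.standard_normal(m)

    # maximum correlation estimate: the k indices with largest |a_j^T y|
    corr = np.abs(A.T @ y)
    support_hat = np.sort(np.argsort(corr)[-k:])

    print("true support     :", support)
    print("estimated support:", support_hat)
    print("exact recovery   :", np.array_equal(support, support_hat))

Rerunning the sketch while shrinking m toward and below the k log(n − k) scaling shows recovery becoming unreliable, which is the regime the necessary condition describes.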

Similar articles

A Sharp Sufficient Condition for Sparsity Pattern Recovery

The sufficient number of linear, noisy measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...

Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery

Consider the n-dimensional vector y = Xβ + ε, where β ∈ R^p has only k nonzero entries and ε ∈ R^n is Gaussian noise. This can be viewed as a linear system with sparsity constraints, corrupted by noise. We find a non-asymptotic upper bound on the probability that the optimal decoder for β declares a wrong sparsity pattern, given any generic perturbation matrix X. In the case when X is randomly dr...
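
To make the "optimal decoder" above concrete, here is a brute-force sketch (my own illustration, not code from the cited paper): for each size-k support S it projects y onto the span of the columns X_S and keeps the support with the smallest residual, which is the maximum-likelihood sparsity pattern under Gaussian noise. The tiny sizes are assumptions chosen so the exhaustive search stays feasible.

    import itertools

    import numpy as np

    rng = np.random.default_rng(1)

    n, k, m = 12, 2, 8             # illustrative sizes; the search is exponential in general
    support = np.sort(rng.choice(n, size=k, replace=False))
    beta = np.zeros(n)
    beta[support] = 1.0

    X = rng.standard_normal((m, n))
    y = X @ beta + 0.1 * rng.standard_normal(m)   # Gaussian noise

    def residual(S):
        # least-squares fit of y on the columns indexed by S; return the residual norm
        Xs = X[:, list(S)]
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        return np.linalg.norm(y - Xs @ coef)

    # keep the size-k support whose column span explains y best
    best = min(itertools.combinations(range(n), k), key=residual)

    print("true support   :", [int(i) for i in support])
    print("decoded support:", list(best))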

Resolution Limits of Sparse Coding in High Dimensions

This paper addresses the problem of sparsity pattern detection for unknown k-sparse n-dimensional signals observed through m noisy, random linear measurements. Sparsity pattern recovery arises in a number of settings including statistical model selection, pattern detection, and image acquisition. The main results in this paper are necessary and sufficient conditions for asymptotically-reliable s...

Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso)

The problem of consistently estimating the sparsity pattern of a vector β ∈ R^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is...
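
A minimal sketch of Lasso-based support recovery in this setting, using scikit-learn's Lasso (an assumed tool choice); the regularization weight alpha, problem sizes, noise level, and the small threshold used to read off the support are illustrative and not the quantities analyzed in the cited work.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(2)

    n, k, m = 128, 5, 90
    support = np.sort(rng.choice(n, size=k, replace=False))
    beta = np.zeros(n)
    beta[support] = rng.choice([-1.0, 1.0], size=k)

    X = rng.standard_normal((m, n)) / np.sqrt(m)
    y = X @ beta + 0.05 * rng.standard_normal(m)

    # Lasso in Lagrangian form: minimize ||y - Xb||^2 / (2m) + alpha * ||b||_1
    lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
    lasso.fit(X, y)

    # read off the estimated sparsity pattern, ignoring numerically tiny coefficients
    support_hat = np.sort(np.flatnonzero(np.abs(lasso.coef_) > 1e-3))

    print("true support     :", support)
    print("estimated support:", support_hat)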

Journal:
  • CoRR

Volume: abs/0804.1839

Pages: -

Publication date: 2008